Parallelizing MCMC via Weierstrass Sampler
Authors
Abstract
With the rapidly growing scale of statistical problems, subset-based communication-free parallel MCMC methods offer a promising direction for large-scale Bayesian analysis. In this article, we propose a new Weierstrass sampler for parallel MCMC based on independent subsets. The new sampler approximates draws from the full-data posterior by combining the posterior draws from independent subset MCMC chains, and thus enjoys high computational efficiency. We show that the approximation error of the Weierstrass sampler is bounded in terms of its tuning parameters and provide guidance on choosing their values. A simulation study shows that the Weierstrass sampler is highly competitive with other methods for combining MCMC chains generated from subsets, including averaging and kernel smoothing.
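To make the subset-based workflow concrete, the sketch below is a minimal illustration, not the paper's algorithm: the data are split across independent random-walk Metropolis chains, and the draws are combined by simple averaging, which is one of the baseline combination rules the abstract compares against. The Gaussian-mean model, chain settings, and all names are illustrative assumptions.

```python
import numpy as np

# Illustrative assumption: Gaussian-mean model with known variance and a flat prior.
rng = np.random.default_rng(0)
theta_true, sigma = 2.0, 1.0
data = rng.normal(theta_true, sigma, size=10_000)
m = 4
subsets = np.array_split(data, m)          # disjoint subsets, one per worker

def subset_mcmc(x, n_iter=5_000, step=0.05):
    """Random-walk Metropolis targeting the subset posterior of theta.

    With a flat prior, the subset posterior is N(mean(x), sigma^2 / len(x)),
    which is what each communication-free chain explores.
    """
    def log_post(theta):
        return -0.5 * np.sum(((x - theta) / sigma) ** 2)

    theta = x.mean()
    lp = log_post(theta)
    draws = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[t] = theta
    return draws

# Independent chains; in practice these run in parallel with no communication.
subset_draws = np.stack([subset_mcmc(x) for x in subsets])

# Baseline combination: average the t-th draw across the m chains.  In this
# conjugate Gaussian case the averaged draws match the full-data posterior
# N(data.mean(), sigma^2 / len(data)); the Weierstrass sampler replaces this
# simple averaging with a kernel-based refinement aimed at non-Gaussian posteriors.
combined = subset_draws.mean(axis=0)
print(combined.mean(), combined.std())
```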
Similar resources
Learning Deep Generative Models with Doubly Stochastic MCMC
We present doubly stochastic gradient MCMC, a simple and generic method for (approximate) Bayesian inference of deep generative models in the collapsed continuous parameter space. At each MCMC sampling step, the algorithm randomly draws a minibatch of data samples to estimate the gradient of log-posterior and further estimates the intractable expectation over latent variables via a Gibbs sample...
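As a hedged illustration of the minibatch-gradient idea described in this blurb (not the paper's doubly stochastic algorithm, which additionally handles latent variables with Gibbs steps), the sketch below performs stochastic-gradient Langevin updates, rescaling the minibatch gradient into an unbiased estimate of the full-data gradient. The model, step size, and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, batch = 50_000, 500
data = rng.normal(1.5, 1.0, size=n)           # assumed model: y_i ~ N(theta, 1)
theta, eps = 0.0, 1e-5                        # initial value, step size

def minibatch_grad(theta):
    """Unbiased estimate of d/dtheta log p(theta | data) under a N(0, 10) prior."""
    idx = rng.choice(n, size=batch, replace=False)
    grad_lik = (n / batch) * np.sum(data[idx] - theta)   # rescaled likelihood term
    grad_prior = -theta / 10.0
    return grad_lik + grad_prior

for t in range(2_000):
    # Langevin update: half-step along the stochastic gradient plus matched noise.
    theta += 0.5 * eps * minibatch_grad(theta) + np.sqrt(eps) * rng.normal()
print("final draw:", theta)
```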
Pseudo-extended Markov chain Monte Carlo
Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require a prohibitively large number of iterations to fully explore the correct posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as a...
Distributed Bayesian Posterior Sampling via Moment Sharing
We propose a distributed Markov chain Monte Carlo (MCMC) inference algorithm for large scale Bayesian posterior simulation. We assume that the dataset is partitioned and stored across nodes of a cluster. Our procedure involves an independent MCMC posterior sampler at each node based on its local partition of the data. Moment statistics of the local posteriors are collected from each sampler and...
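A hedged, one-shot reading of this moment-sharing idea is sketched below (the method in the abstract iterates the exchange, using expectation propagation): each node reports the mean and covariance of its local posterior draws, and the server forms a precision-weighted Gaussian product. All names are illustrative assumptions, and the sketch assumes each node targeted prior^(1/num_nodes) times its local likelihood, so the full posterior is approximately the product of the local posteriors.

```python
import numpy as np

def combine_gaussian_moments(means, covs):
    """Precision-weighted product of per-node Gaussian posterior approximations.

    means: list of length-d arrays (local posterior means)
    covs:  list of (d, d) arrays (local posterior covariances)
    Returns the mean and covariance of the product-of-Gaussians approximation
    to the full-data posterior.
    """
    precisions = [np.linalg.inv(c) for c in covs]
    prec_total = np.sum(precisions, axis=0)
    cov_total = np.linalg.inv(prec_total)
    mean_total = cov_total @ np.sum([p @ m for p, m in zip(precisions, means)], axis=0)
    return mean_total, cov_total

# Toy usage: three nodes reporting moments of their local posterior draws.
local_means = [np.array([1.0, 0.5]), np.array([1.2, 0.4]), np.array([0.9, 0.6])]
local_covs = [np.eye(2) * v for v in (0.30, 0.25, 0.35)]
mean, cov = combine_gaussian_moments(local_means, local_covs)
print(mean, np.diag(cov))
```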
Generalised linear mixed model analysis via sequential Monte Carlo sampling
We present a sequential Monte Carlo sampler algorithm for the Bayesian analysis of generalised linear mixed models (GLMMs). These models support a variety of interesting regression-type analyses, but performing inference is often extremely difficult, even when using the Bayesian approach combined with Markov chain Monte Carlo (MCMC). The Sequential Monte Carlo sampler (SMC) is a new and general...
Fast Bayesian reconstruction of chaotic dynamical systems via extended Kalman filtering.
We present an improved Markov chain Monte Carlo (MCMC) algorithm for posterior computation in chaotic dynamical systems. Recent Bayesian approaches to estimate the parameters of chaotic maps have used the Gibbs sampler which exhibits slow convergence due to high posterior correlations. Using the extended Kalman filter to compute the likelihood function by integrating out all unknown system stat...